Learning unbelievable probabilities
Authors
Abstract
Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we show that many probability distributions have marginals that cannot be reached by belief propagation using any set of model parameters or any learning algorithm. We call such marginals 'unbelievable.' This problem occurs whenever the Hessian of the Bethe free energy is not positive-definite at the target marginals. All learning algorithms for belief propagation necessarily fail in these cases, producing beliefs or sets of beliefs that may even be worse than the pre-learning approximation. We then show that averaging inaccurate beliefs, each obtained from belief propagation using model parameters perturbed about some learned mean values, can achieve the unbelievable marginals.
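To make the setup concrete, the following is a minimal sketch (not the authors' code) of sum-product loopy belief propagation on a small binary pairwise model, followed by the averaging idea from the last sentence of the abstract: beliefs are collected from BP runs whose parameters are perturbed about mean values and then averaged. The three-node frustrated cycle, the bias and coupling values, and the perturbation scale are illustrative assumptions, written here in Python/NumPy.

import numpy as np

def loopy_bp(h, J, iters=200, damping=0.5):
    # Sum-product BP for binary spins s_i in {-1, +1} with
    # p(s) proportional to exp(sum_i h_i s_i + sum_{i<j} J_ij s_i s_j).
    # Returns the singleton beliefs b_i = P(s_i = +1).
    n = len(h)
    s_vals = np.array([-1.0, 1.0])
    # m[(i, j)] is the message from node i to node j, indexed by the value of s_j
    m = {(i, j): np.ones(2) / 2
         for i in range(n) for j in range(n)
         if i != j and J[i, j] != 0}
    for _ in range(iters):
        new_m = {}
        for (i, j) in m:
            # local factor at i times incoming messages to i, excluding the one from j
            prod = np.exp(h[i] * s_vals)
            for k in range(n):
                if k != j and (k, i) in m:
                    prod = prod * m[(k, i)]
            # marginalize over s_i against the pairwise factor exp(J_ij s_i s_j)
            msg = np.array([np.sum(np.exp(J[i, j] * s_vals * s_j) * prod)
                            for s_j in s_vals])
            msg /= msg.sum()
            new_m[(i, j)] = damping * m[(i, j)] + (1 - damping) * msg
        m = new_m
    beliefs = np.zeros(n)
    for i in range(n):
        prod = np.exp(h[i] * s_vals)
        for k in range(n):
            if (k, i) in m:
                prod = prod * m[(k, i)]
        beliefs[i] = prod[1] / prod.sum()   # b_i = P(s_i = +1)
    return beliefs

# A frustrated 3-cycle: the smallest loopy model on which BP beliefs are inexact.
h = np.zeros(3)
J = np.zeros((3, 3))
J[0, 1] = J[1, 0] = 1.0
J[0, 2] = J[2, 0] = 1.0
J[1, 2] = J[2, 1] = -1.0

print("beliefs from a single BP run:", loopy_bp(h, J))

# Average beliefs from BP runs whose biases are perturbed about mean values
# (the Gaussian perturbation scale 0.3 is an arbitrary illustrative choice).
rng = np.random.default_rng(0)
runs = [loopy_bp(h + 0.3 * rng.standard_normal(3), J) for _ in range(200)]
print("averaged beliefs over perturbed runs:", np.mean(runs, axis=0))

On a loopy graph such as this cycle, the single-run beliefs are only approximations to the true marginals; averaging beliefs over perturbed parameters yields a richer family of marginals that, as the abstract argues, can reach targets no single BP fixed point can.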
Similar resources
Learning unbelievable marginal probabilities
Loopy belief propagation performs approximate inference on graphical models with loops. One might hope to compensate for the approximation by adjusting model parameters. Learning algorithms for this purpose have been explored previously, and the claim has been made that every set of locally consistent marginals can arise from belief propagation run on a graphical model. On the contrary, here we...
Cooperative Replies to Unbelievable Assertions - A Dialogue Protocol based on Logical Interpolation
We propose a dialogue protocol for situations in which one agent makes an assertion that another agent finds impossible to believe. In this interaction, unbelievable assertions are rejected with explanations formed by logical interpolation, and new assertions are made so that all previous rebuttals are taken into account.
An Introduction to Inference and Learning in Bayesian Networks
Bayesian networks (BNs) are modern tools for modeling phenomena in dynamic and static systems and are used in various domains such as disease diagnosis, weather forecasting, decision making, and clustering. A BN is a graphical probabilistic model that represents causal relations among random variables and consists of a directed acyclic graph and a set of conditional probabilities. Structure...
An Unbelievable Foreign Body in a Maxillary Sinus
Introduction: Misdiagnosis and the resulting mismanagement are challenging issues in complicated cases that present with obscure complaints. Radiologic studies, especially conventional plain radiographs, remain the most frequently prescribed and useful modality for the first step of assessment. Case Presentation: In this report, we present...
Robust Conditional Probabilities
Conditional probabilities are a core concept in machine learning. For example, optimal prediction of a label Y given an input X corresponds to maximizing the conditional probability of Y given X. A common approach to inference tasks is learning a model of conditional probabilities. However, these models are often based on strong assumptions (e.g., log-linear models), and hence the...
Journal:
Advances in Neural Information Processing Systems
Volume 24, Issue -
Pages -
Publication date: 2011